13 research outputs found

    Machine-learning methods for weak lensing analysis of the ESA Euclid sky survey

    A clear picture has emerged from the last three decades of research: our Universe is expanding at an accelerating rate. The cause of this expansion remains elusive but, in essence, acts as a repulsive force. This so-called dark energy represents about 69% of the energy content of the Universe. A further 26% is contained in dark matter, a form of matter that is electromagnetically invisible. Understanding the nature of these two major components of the Universe is among the foremost unsolved problems in physics. To unveil answers, ambitious experiments are being devised to survey an ever larger and deeper fraction of the sky. One such project is the European Space Agency (ESA) telescope Euclid, which will probe dark matter and provide much-needed constraints on dark energy. Because light bundles follow null geodesics, their trajectories are affected by the mass distribution along the line of sight, which includes dark matter. This is gravitational lensing. In the vast majority of cases, the deformations of source objects are weak, and their profiles are only slightly sheared. The nature of the dark components can be fathomed by measuring this shear over a large fraction of the sky. The shear can be recovered through a statistical analysis of a large number of objects. In this thesis, we take on the development of the tools necessary to measure the shear. Shear measurement techniques have been developed and improved for more than two decades; their performance, however, does not meet the unprecedented requirements imposed by future surveys. These requirements trickle down from the targeted determination of the cosmological parameters. We aim to develop novel and innovative methods and test them against the Euclid requirements. Our contributions fall into two major themes. A key step in the processing of weak gravitational lensing data is the correction of image deformations generated by the instrument itself.
This point spread function (PSF) correction is the first theme. The second is the shear measurement itself, and in particular producing accurate measurements. We explore machine-learning methods, notably artificial neural networks. These methods are, for the most part, data-driven: schemes must first be trained on a representative sample of data. Crafting optimal training sets and choosing the method parameters can be crucial for performance. We dedicate a substantial fraction of this dissertation to describing the simulations behind the datasets and to motivating our parameter choices. In the first part of this thesis, we propose schemes to build a clean selection of stars and to model the PSF to the Euclid requirements. Shear measurements are notoriously biased because of the small size and low intensity of the galaxies. We introduce an approach that produces unbiased estimates of shear. This is achieved by processing the output of any shape measurement technique with artificial neural networks, predicting corrected estimates of the shapes of the galaxies, or directly the shear. We demonstrate that simple networks with simple trainings are sufficient to reach the Euclid requirements on shear measurements.
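As a minimal sketch of this idea, assuming a toy forward model with a known multiplicative bias (an assumption for illustration, not the thesis simulations), a small network can be trained on simulated pairs of biased estimates and true shears to produce corrected estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forward model: a shape measurement returns e = (1 + m) * g + noise,
# i.e. it carries a multiplicative bias m. All numbers here are toy choices.
m_true = -0.2
g = rng.uniform(-0.1, 0.1, size=(2000, 1))                 # true shear per galaxy
e = (1.0 + m_true) * g + 0.005 * rng.normal(size=g.shape)  # biased, noisy estimate

# Standardise the input feature; the network regresses the true shear.
x = (e - e.mean()) / e.std()
y = g / g.std()

# Shallow network: one hidden layer of 8 tanh units, trained by
# full-batch gradient descent on the mean-squared error.
W1 = 0.1 * rng.normal(size=(1, 8)); b1 = np.zeros(8)
W2 = 0.1 * rng.normal(size=(8, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(3000):
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    grad = 2.0 * (pred - y) / len(y)       # dLoss/dpred
    W2g = h.T @ grad; b2g = grad.sum(0)
    hg = grad @ W2.T * (1.0 - h**2)        # backpropagate through tanh
    W1g = x.T @ hg; b1g = hg.sum(0)
    W1 -= lr * W1g; b1 -= lr * b1g
    W2 -= lr * W2g; b2 -= lr * b2g

corrected = (np.tanh(x @ W1 + b1) @ W2 + b2) * g.std()

# Residual multiplicative bias: slope of estimates versus true shear, minus 1.
m_raw = np.polyfit(g.ravel(), e.ravel(), 1)[0] - 1.0
m_corr = np.polyfit(g.ravel(), corrected.ravel(), 1)[0] - 1.0
```

On this toy problem the raw estimates carry the injected bias of about -20%, while the network's corrected estimates recover a slope close to unity; a small residual remains because the input feature is itself noisy.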

    Weak-lensing shear measurement with machine learning: teaching artificial neural networks about feature noise

    Cosmic shear is a primary cosmological probe for several present and upcoming surveys investigating dark matter and dark energy, such as Euclid or WFIRST. The probe requires an extremely accurate measurement of the shapes of millions of galaxies based on imaging data. Crucially, the shear measurement must address and compensate for a range of interwoven nuisance effects related to the instrument optics and detector, noise, unknown galaxy morphologies, colors, blending of sources, and selection effects. This paper explores the use of supervised machine learning (ML) as a tool to solve this inverse problem. We present a simple architecture that learns to regress shear point estimates and weights via shallow artificial neural networks. The networks are trained on simulations of the forward observing process, and take combinations of moments of the galaxy images as inputs. A challenging peculiarity of this ML application is the combination of the noisiness of the input features and the requirements on the accuracy of the inverse regression. To address this issue, the proposed training algorithm minimizes bias over multiple realizations of individual source galaxies, reducing the sensitivity to properties of the overall sample of source galaxies. Importantly, an observational selection function of these source galaxies can be straightforwardly taken into account via the weights. We first introduce key aspects of our approach using toy-model simulations, and then demonstrate its potential on images mimicking Euclid data. Finally, we analyze images from the GREAT3 challenge, obtaining competitively low shear biases despite the use of a simple training set. We conclude that the further development of ML approaches is of high interest to meet the stringent requirements on the shear measurement in current and future surveys. 
A demonstration implementation of our technique is publicly available at https://astro.uni-bonn.de/~mtewes/ml-shear-meas. (31 pages, 26 figures; published in A&A.)
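The realization-averaged training objective can be illustrated with a deliberately simple toy (a one-parameter linear estimator instead of a network; all numbers assumed): averaging the predictions over many noise realizations of each galaxy before squaring penalizes bias, whereas the ordinary mean-squared error rewards shrinking noisy estimates toward zero.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: n_cases galaxies, each observed in n_rea noise realizations;
# the single feature is a noisy ellipticity-like measurement of the shear g.
n_cases, n_rea = 200, 100
g = rng.uniform(-0.1, 0.1, size=(n_cases, 1, 1))
feat = g + 0.05 * rng.normal(size=(n_cases, n_rea, 1))

def msb_loss(w):
    """Mean-square-bias: average over realizations *before* squaring,
    so the estimator is pushed to be unbiased, not merely low-scatter."""
    pred = feat * w                       # linear point estimator
    bias = pred.mean(axis=1) - g[:, 0]    # residual of the per-case mean
    return float((bias**2).mean())

def mse_loss(w):
    """Ordinary mean-squared error, for comparison."""
    return float(((feat * w - g)**2).mean())

# Minimize both objectives by a brute-force scan over the single weight w.
ws = np.linspace(0.5, 1.5, 1001)
w_msb = ws[np.argmin([msb_loss(w) for w in ws])]
w_mse = ws[np.argmin([mse_loss(w) for w in ws])]
```

The bias-based objective recovers a weight near 1 (an unbiased estimator), while the plain mean-squared error prefers a weight well below 1, shrinking the noisy feature toward zero and thereby biasing the shear low.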

    Development of a data-driven method for assessing health and welfare in the most common livestock species in Switzerland: The Smart Animal Health project.

    Improving animal health and welfare in livestock systems depends on reliable proxies for assessment and monitoring. The aim of this project was to develop a novel method that relies on animal-based indicators and data-driven metrics for assessing health and welfare at farm level for the most common livestock species in Switzerland. Method development followed a uniform multi-stage process for each species. The scientific literature was systematically reviewed to identify potential health and welfare indicators for cattle, sheep, goats, pigs and poultry. Suitable indicators were applied in the field and compared with the outcomes of the Welfare Quality® scores of a given farm. To identify farms at risk of violating animal welfare regulations, several agricultural and animal health databases were interconnected, and various supervised machine-learning techniques were applied to model the status of farms. The literature reviews identified a variety of indicators, some of which are well established, while others lack reliability or practicability, or still need further validation. Data quality and availability varied strongly among animal species, with most data available for dairy cows and pigs. Data-based indicators were almost exclusively limited to the categories "Animal health" and "Husbandry and feeding". The assessment of "Appropriate behavior" and "Freedom from pain, suffering, harm and anxiety" depended largely on indicators that had to be assessed and monitored on-farm. The different machine-learning techniques used to identify farms for risk-based animal welfare inspections reached similar classification performances, with sensitivities above 80%. The features with the highest predictive weights were participation in federal ecological and animal welfare programs, farm demographics, and farmers' notification discipline for animal movements. A common method with individual sets of indicators for each species was developed.
The results show that, depending on data availability for the individual animal categories, models based on proxy data can achieve high correlations with animal health and welfare assessed on-farm. Nevertheless, for sufficient validity, a combination of data-based indicators and on-farm assessments is currently required. For a broad implementation of the methods, alternatives to extensive manual on-farm assessments are needed; smart farming technologies have great potential to support the assessment, provided the specific monitoring goals are defined.
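The sensitivity figure quoted above is the fraction of truly non-compliant farms that a classifier flags for inspection. A hypothetical helper (not from the project) makes the metric concrete:

```python
import numpy as np

def sensitivity(y_true, y_pred):
    """True-positive rate: of the farms that actually violated welfare
    regulations (label 1), what fraction did the classifier flag?"""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = int(np.sum((y_true == 1) & (y_pred == 1)))  # correctly flagged
    fn = int(np.sum((y_true == 1) & (y_pred == 0)))  # missed violations
    return tp / (tp + fn)

# Illustrative labels: 5 non-compliant farms, 5 compliant farms.
y_true = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 1, 0, 0, 1, 0, 0, 0]
sens = sensitivity(y_true, y_pred)   # 4 of 5 violations recovered -> 0.8
```

A sensitivity above 80% thus means fewer than one in five non-compliant farms escapes a risk-based inspection, at whatever false-positive cost the classifier incurs on compliant farms.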

    GREAT3 results I: systematic errors in shear estimation and the impact of real galaxy morphology

    We present first results from the third GRavitational lEnsing Accuracy Testing (GREAT3) challenge, the third in a sequence of challenges for testing methods of inferring weak gravitational lensing shear distortions from simulated galaxy images. GREAT3 was divided into experiments to test three specific questions, and included simulated space- and ground-based data with constant or cosmologically-varying shear fields. The simplest (control) experiment included parametric galaxies with a realistic distribution of signal-to-noise, size, and ellipticity, and a complex point spread function (PSF). The other experiments tested the additional impact of realistic galaxy morphology, multiple exposure imaging, and the uncertainty about a spatially-varying PSF; the last two questions will be explored in Paper II. The 24 participating teams competed to estimate lensing shears to within systematic error tolerances for upcoming Stage-IV dark energy surveys, making 1525 submissions overall. GREAT3 saw considerable variety and innovation in the types of methods applied. Several teams now meet or exceed the targets in many of the tests conducted (to within the statistical errors). We conclude that the presence of realistic galaxy morphology in simulations changes shear calibration biases by ∼1 per cent for a wide range of methods. Other effects such as truncation biases due to finite galaxy postage stamps, and the impact of galaxy type as measured by the Sérsic index, are quantified for the first time. Our results generalize previous studies regarding sensitivities to galaxy size and signal-to-noise, and to PSF properties such as seeing and defocus. Almost all methods' results support the simple model in which additive shear biases depend linearly on PSF ellipticity. (32 pages plus 15 pages of technical appendices; 28 figures; submitted to MNRAS.)

    The CHEOPS mission

    The CHaracterising ExOPlanet Satellite (CHEOPS) was selected in 2012 as the first small mission in the ESA Science Programme and was successfully launched in December 2019. CHEOPS is a partnership between ESA and Switzerland, with important contributions by ten additional ESA Member States. It is the first mission dedicated to searching for transits of exoplanets using ultrahigh-precision photometry of bright stars already known to host planets. As a follow-up mission, CHEOPS is mainly dedicated to improving, whenever possible, existing radius measurements or providing first accurate measurements for a subset of those planets for which the mass has already been estimated from ground-based spectroscopic surveys, and to following phase curves. CHEOPS will provide prime targets for future spectroscopic atmospheric characterisation. Requirements on the photometric precision and stability have been derived for stars with magnitudes ranging from 6 to 12 in the V band. In particular, CHEOPS shall be able to detect Earth-size planets transiting G5 dwarf stars in the magnitude range between 6 and 9 by achieving a photometric precision of 20 ppm in 6 hours of integration. For K stars in the magnitude range between 9 and 12, CHEOPS shall be able to detect transiting Neptune-size planets, achieving a photometric precision of 85 ppm in 3 hours of integration. This is achieved by using a single frame-transfer, back-illuminated CCD detector at the focal plane assembly of a 33.5 cm diameter telescope. The 280 kg spacecraft has a pointing accuracy of about 1 arcsec rms and operates in a Sun-synchronous dusk-dawn orbit at 700 km altitude. The nominal mission lifetime is 3.5 years. During this period, 20% of the observing time is available to the community through a yearly call and a discretionary-time programme managed by ESA. (Submitted to Experimental Astronomy.)
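Under the idealised assumption that the noise is white, so that precision improves as the square root of the integration time, the quoted requirements can be rescaled to other integration times. The sketch below is an illustration of that scaling, not the mission's actual noise model:

```python
import math

def scaled_precision(ppm_ref, hours_ref, hours):
    """Rescale a photometric-precision requirement assuming pure white
    noise, i.e. precision proportional to 1/sqrt(integration time)."""
    return ppm_ref * math.sqrt(hours_ref / hours)

# The 20 ppm in 6 h requirement, expressed for 1 hour of integration:
ppm_1h = scaled_precision(20.0, 6.0, 1.0)   # ~ 49 ppm per hour
```

Real light curves also contain correlated (red) noise from pointing jitter, stray light and thermal effects, so this square-root scaling is only a first-order budget check.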

    SALSA: a tool to estimate the stray light contamination for low-Earth orbit observatories

    Stray light contamination considerably reduces the photometric precision of faint stars for low-altitude space-borne observatories. When measuring faint objects, stray light contamination must be dealt with in order to avoid systematic impacts on low signal-to-noise images. Stray light contamination can be represented by a flat offset in CCD data. Mitigation techniques begin with a comprehensive study during the design phase, followed by the use of target-pointing optimisation and post-processing methods. We present a code that simulates the stray-light contamination in low-Earth orbit coming from the reflection of solar light by the Earth. The StrAy Light SimulAtor (SALSA) is intended to be used at an early design stage to evaluate the effective visible region of the sky and, therefore, to optimise the observation sequence. SALSA can compute Earth stray light contamination for significant periods of time, allowing mission-wide parameters to be optimised (e.g. imposing constraints on the point source transmission function (PST) and/or on the altitude of the satellite). It can also be used to study the behaviour of the stray light at different seasons or latitudes. Given the position of the satellite with respect to the Earth and the Sun, SALSA computes the stray light at the entrance of the telescope following a geometrical technique. After characterising the illuminated region of the Earth, the portion of the illuminated Earth that affects the satellite is calculated. Then, the flux of reflected solar photons is evaluated at the entrance of the telescope. Using the PST of the instrument, the final stray light contamination at the detector is calculated. The analysis tools include time-series analysis of the contamination, evaluation of the sky coverage and an object-visibility predictor. Effects of the South Atlantic Anomaly and of any shutdown periods of the instrument can be added.
Several designs or mission concepts can be easily tested and compared. The code is not intended as a stand-alone mission design tool. Its mandatory inputs are a time series describing the trajectory of the satellite and the characteristics of the instrument. This software suite has been applied to the design and analysis of CHEOPS (CHaracterising ExOPlanet Satellite). This mission requires very high precision photometry to detect very shallow transits of exoplanets. Different altitudes and characteristics of the detector have been studied in order to find the parameters that best reduce the effect of contamination.
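One ingredient of such a geometrical treatment is the apparent size of the Earth as seen from orbit, which bounds how close to the Earth limb a telescope can point before reflected sunlight enters the baffle. The following simplified spherical-Earth sketch is an assumption for illustration, not the SALSA implementation:

```python
import math

R_EARTH = 6371.0  # mean Earth radius in km (spherical-Earth simplification)

def earth_angular_radius_deg(altitude_km):
    """Angular radius of the Earth disc seen from a satellite at the
    given altitude: sin(rho) = R / (R + h) for a spherical Earth."""
    return math.degrees(math.asin(R_EARTH / (R_EARTH + altitude_km)))

# At a 700 km orbit (comparable to CHEOPS), the Earth disc subtends
# roughly a 64-degree angular radius, i.e. it fills much of the sky.
rho_700 = earth_angular_radius_deg(700.0)
```

The large angular size of the Earth at low altitudes is why pointing constraints and the PST, which describes how off-axis light is attenuated, dominate the stray-light budget of such missions.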
